THE SPECTRAL NORM OF RANDOM INNER-PRODUCT KERNEL MATRICES
Authors
Abstract
We study the spectra of p × p random matrices K with off-diagonal (i, j) entry equal to n^{−1/2} k(X_i^T X_j / n), where the X_i are the rows of a p × n matrix with i.i.d. entries and k is a scalar function. It is known that under mild conditions, as n and p increase proportionally, the empirical spectral measure of K converges to a deterministic limit μ. We prove that if k is a polynomial and the distribution of the entries of X_i is symmetric and satisfies a general moment bound, then K is the sum of two components, the first with spectral norm converging to ‖μ‖ (the maximum absolute value of the support of μ) and the second a perturbation of rank at most two. In certain cases, including when k is an odd polynomial function, the perturbation is 0 and the spectral norm ‖K‖ converges to ‖μ‖. If the entries of X_i are Gaussian, we also prove that ‖K‖ converges to ‖μ‖ for a large class of odd non-polynomial functions k. In general, the perturbation may contribute spike eigenvalues to K outside of its limiting support, and we conjecture that they have deterministic limiting locations as predicted by a deformed GUE model. Our study of such matrices is motivated by the analysis of statistical thresholding procedures to estimate sparse covariance matrices from multivariate data, and our results imply an asymptotic approximation to the spectral norm error of such procedures when the population covariance is the identity.
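The matrix studied in the abstract can be constructed directly. The following is a minimal NumPy sketch, not code from the paper: it builds K with off-diagonal entries n^{−1/2} k(X_i^T X_j / n) for an odd polynomial kernel and computes its spectral norm. The choice of kernel, the zeroed diagonal, and the dimensions are illustrative assumptions.

```python
import numpy as np

# Sketch of the random inner-product kernel matrix from the abstract.
# Assumptions (not from the paper's code): Gaussian entries, an odd
# polynomial kernel, and a zeroed diagonal (the paper only specifies
# the off-diagonal entries).
rng = np.random.default_rng(0)
n, p = 2000, 1000                      # proportional regime: p/n fixed

X = rng.standard_normal((p, n))        # rows X_i with i.i.d. entries

def k(x):
    return x**3 - 3 * x                # odd polynomial kernel (illustrative)

G = X @ X.T / n                        # inner products X_i^T X_j / n
K = k(G) / np.sqrt(n)                  # entrywise kernel, scaled by n^{-1/2}
np.fill_diagonal(K, 0.0)               # keep only the off-diagonal part

spectral_norm = np.linalg.norm(K, 2)   # largest singular value, i.e. ||K||
print(spectral_norm)
```

Because K is symmetric, `np.linalg.norm(K, 2)` returns the maximum absolute eigenvalue, which is the quantity compared to ‖μ‖ in the abstract.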
Similar works
Spectral Norm of Random Kernel Matrices with Applications to Privacy
Kernel methods are an extremely popular set of techniques used for many important machine learning and data analysis applications. In addition to having good practical performance, these methods are supported by a well-developed theory. Kernel methods use an implicit mapping of the input data into a high dimensional feature space defined by a kernel function, i.e., a function returning the inne...
Cartesian decomposition of matrices and some norm inequalities
Let X be an n-square complex matrix with the Cartesian decomposition X = A + iB, where A and B are n × n Hermitian matrices. It is known that $\Vert X \Vert_p^2 \leq 2(\Vert A \Vert_p^2 + \Vert B \Vert_p^2)$, where $p \geq 2$ and $\Vert \cdot \Vert_p$ is the Schatten p-norm. In this paper, this inequality and some of its improvements ...
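The inequality in this snippet is easy to check numerically. The sketch below is illustrative only (the `schatten` helper and the random test matrix are assumptions, not from the cited paper): it forms the Cartesian decomposition X = A + iB with A = (X + X^H)/2 and B = (X − X^H)/(2i), then verifies the Schatten-norm bound for a few values of p ≥ 2.

```python
import numpy as np

def schatten(M, p):
    # Schatten p-norm: l^p norm of the singular values of M
    s = np.linalg.svd(M, compute_uv=False)
    return np.sum(s ** p) ** (1.0 / p)

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))

A = (X + X.conj().T) / 2    # Hermitian part
B = (X - X.conj().T) / 2j   # "imaginary" part, itself Hermitian

for p in (2, 3, 4):
    lhs = schatten(X, p) ** 2
    rhs = 2 * (schatten(A, p) ** 2 + schatten(B, p) ** 2)
    assert lhs <= rhs + 1e-12
```

For p = 2 (the Frobenius norm) the decomposition gives equality, ‖X‖_2^2 = ‖A‖_2^2 + ‖B‖_2^2, so the factor 2 in the bound is not needed there; the inequality becomes meaningful for larger p.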
Fuzzy Inner Product and Fuzzy Norm of Hyperspaces
We introduce and study fuzzy (co-)inner products and fuzzy (co-)norms of hyperspaces. In this regard, by considering the notion of hyperspaces as a generalization of vector spaces, we first introduce the notion of a fuzzy (co-)inner product on hyperspaces and apply it to formulate the notions of fuzzy (co-)norm and fuzzy (co-)orthogonality in hyperspaces. In particular, we prove that ...
The Spectrum of Random Inner-product Kernel Matrices
Abstract: We consider n-by-n matrices whose (i, j)-th entry is f(X_i^T X_j), where X_1, . . . , X_n are i.i.d. standard Gaussian random vectors in R^p, and f is a real-valued function. The eigenvalue distribution of these random kernel matrices is studied in the "large p, large n" regime. It is shown that, when p, n → ∞ with p/n = γ, a constant, and f is properly scaled so that Var(f(X_i^T X_j))...
A Comparative Study of Fuzzy Inner Product Spaces
In the present paper, we investigate a connection between two fuzzy inner products, one of which arises from Felbin's fuzzy norm while the other is based on Bag and Samanta's fuzzy norm. We also show how, given a fuzzy inner product space, one can construct another kind of fuzzy inner product on this space.